Input-dependent Regularization of Conditional Density Models

Author

  • Matthias Seeger
Abstract

We emphasize the need for input-dependent regularization in the context of conditional density models (also: discriminative models) like Gaussian process predictors. This can be achieved by a simple modification of the standard Bayesian data generation model underlying these techniques. Specifically, we allow the latent target function to be a priori dependent on the distribution of the input points. While the standard generation model results in robust predictors, data with missing labels is ignored, which can be wasteful if relevant prior knowledge is available. We show that discriminative models like Fisher kernel discriminants and Co-Training classifiers can be regarded as (approximate) Bayesian inference techniques under the modified generation model, and that the template Co-Training algorithm is related to a variant of the well-known Expectation-Maximization (EM) technique. We propose a template EM algorithm for the modified generation model which can be regarded as a generalization of Co-Training.
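The abstract's closing idea, an EM-style loop in which unlabeled inputs influence the fit through soft labels, can be illustrated with a toy sketch. This is not Seeger's template algorithm: the two-class 1-D Gaussian class-conditional model, the synthetic data, and the fixed iteration count are all assumptions chosen for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy data: two 1-D Gaussian classes; few labeled, many unlabeled points.
x_lab = np.array([-2.0, -1.5, 1.5, 2.0])
y_lab = np.array([0, 0, 1, 1])
x_unl = np.concatenate([rng.normal(-2, 0.5, 50), rng.normal(2, 0.5, 50)])

def fit(x, w):
    """Weighted mean/variance for one class (M-step)."""
    mu = np.average(x, weights=w)
    var = np.average((x - mu) ** 2, weights=w) + 1e-6
    return mu, var

def gauss(x, mu, var):
    return np.exp(-(x - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)

# Initialize the class models from the labeled data alone.
params = [fit(x_lab[y_lab == k], np.ones((y_lab == k).sum())) for k in (0, 1)]

for _ in range(20):
    # E-step: soft labels (responsibilities) for the unlabeled points.
    p = np.stack([gauss(x_unl, *params[k]) for k in (0, 1)])
    resp = p / p.sum(axis=0)
    # M-step: refit each class on labeled (hard) + unlabeled (soft) data.
    params = [
        fit(np.concatenate([x_lab[y_lab == k], x_unl]),
            np.concatenate([np.ones((y_lab == k).sum()), resp[k]]))
        for k in (0, 1)
    ]

print(round(params[0][0], 1), round(params[1][0], 1))
# Class means move from the labeled estimates toward the unlabeled clusters.
```

The unlabeled points pull the class means toward the true cluster centers, which is exactly the effect the modified generation model licenses: the input distribution carries information about the latent target function.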


Similar articles

Multi-Conditional Learning: Generative/Discriminative Training for Clustering and Classification

This paper presents multi-conditional learning (MCL), a training criterion based on a product of multiple conditional likelihoods. When the traditional conditional probability of "label given input" is combined with a generative probability of "input given label", the latter acts as a surprisingly effective regularizer. When applied to models with latent variables, MCL combines the structure-discov...
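The combined criterion can be sketched on a toy model. Here the class means parameterize both p(x|y) (generative) and, via Bayes' rule, p(y|x) (discriminative), so the generative term constrains the shared parameters; the function name, the Gaussian model, and the weight `alpha` are illustrative assumptions, not from the paper.

```python
import numpy as np

def mcl_objective(x, y, mus, var=1.0, prior=(0.5, 0.5), alpha=0.5):
    """Weighted sum of mean log p(y|x) and mean log p(x|y), sharing `mus`."""
    def log_gauss(x, mu):
        return -(x - mu) ** 2 / (2 * var) - 0.5 * np.log(2 * np.pi * var)

    log_pxy = log_gauss(x, np.asarray(mus)[y])                # log p(x|y)
    log_joint = np.stack([log_gauss(x, mus[k]) + np.log(prior[k])
                          for k in (0, 1)])
    log_pyx = log_joint[y, np.arange(len(x))] \
        - np.logaddexp.reduce(log_joint, axis=0)              # log p(y|x)
    return log_pyx.mean() + alpha * log_pxy.mean()

x = np.array([-1.2, -0.8, 0.9, 1.1])
y = np.array([0, 0, 1, 1])
good = mcl_objective(x, y, mus=(-1.0, 1.0))
bad = mcl_objective(x, y, mus=(1.0, -1.0))
print(good > bad)  # True: correct means score higher under both terms
```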


ON THE STATIONARY PROBABILITY DENSITY FUNCTION OF BILINEAR TIME SERIES MODELS: A NUMERICAL APPROACH

In this paper, we show that the Chapman-Kolmogorov formula could be used as a recursive formula for computing the m-step-ahead conditional density of a Markov bilinear model. The stationary marginal probability density function of the model may be approximated by the m-step-ahead conditional density for sufficiently large m.


Robust Fuzzy Content Based Regularization Technique in Super Resolution Imaging

Super-resolution (SR) aims to overcome the ill-posed conditions of image acquisition. SR facilitates scene recognition from low-resolution image(s). It generally assumes that high- and low-resolution images share similar intrinsic geometries. Various approaches have tried to aggregate the informative details of multiple low-resolution images into a high-resolution one. In this paper, we present a n...


Efficient multiple hyperparameter learning for log-linear models

In problems where input features have varying amounts of noise, using distinct regularization hyperparameters for different features provides an effective means of managing model complexity. While regularizers for neural networks and support vector machines often rely on multiple hyperparameters, regularizers for structured prediction models (used in tasks such as sequence labeling or parsing) ...
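The per-feature idea is easy to see in closed form for ridge regression: give each feature its own penalty λ_j, so noisy features can be shrunk hard while informative ones stay nearly unpenalized. The data, the hand-set penalties, and the function name below are illustrative assumptions (the paper learns such hyperparameters rather than setting them by hand).

```python
import numpy as np

rng = np.random.default_rng(1)

# Only feature 0 is informative; features 1 and 2 are pure noise.
X = rng.normal(size=(100, 3))
y = 2.0 * X[:, 0] + rng.normal(scale=0.1, size=100)

def ridge_per_feature(X, y, lams):
    # Closed form: w = (XᵀX + diag(lams))⁻¹ Xᵀy, one lambda per feature.
    return np.linalg.solve(X.T @ X + np.diag(lams), X.T @ y)

# Small penalty on the informative feature, large on the noisy ones.
w = ridge_per_feature(X, y, np.array([0.1, 100.0, 100.0]))
print(np.round(w, 1))  # noisy-feature weights shrunk toward zero
```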


Structured Regularizer for Neural Higher-Order Sequence Models

We introduce both neural higher-order linear-chain conditional random fields (NHO-LC-CRFs) and a new structured regularizer for these sequence models. We show that this regularizer can be derived as a lower bound from a mixture of models sharing parts of each other, e.g. neural sub-networks, and relate it to ensemble learning. Furthermore, it can be expressed explicitly as a regularization term in ...




Publication date: 2001